Worst case complexity of direct search
Author
Abstract
In this paper we prove that the broad class of direct-search methods of directional type, based on imposing sufficient decrease to accept new iterates, shares the worst case complexity bound of steepest descent for the unconstrained minimization of a smooth function: the number of iterations needed to reduce the norm of the gradient of the objective function below a given threshold is at most proportional to the inverse of the threshold squared. In direct-search methods, the objective function is evaluated, at each iteration, at a finite number of points; no derivatives are required. An iteration is declared successful (moving to a point of lower objective function value) or unsuccessful (staying at the same iterate) based solely on comparisons of objective function values. Some of these methods are directional, in the sense of moving along predefined directions along which the objective function will eventually decrease for sufficiently small step sizes. The worst case complexity bounds derived here measure both the maximum number of iterations and the maximum number of objective function evaluations required to find a point with gradient norm below the required threshold, and they are proved for such directional direct-search methods when a sufficient decrease condition based on the size of the steps is imposed to accept new iterates.
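To make the class of methods described above concrete, the following is a minimal Python sketch of one plausible instance: polling along the coordinate directions and their negatives, with new iterates accepted only under a sufficient decrease of the form rho(alpha) = c * alpha**2. The particular forcing function, polling set, and step-size update factors (c, theta, gamma) are illustrative assumptions, not prescriptions from the paper.

import numpy as np

def direct_search(f, x0, alpha0=1.0, c=1e-4, theta=0.5, gamma=2.0,
                  alpha_tol=1e-8, max_iter=10_000):
    """Directional direct search with a sufficient decrease condition (illustrative sketch).

    Polls f along the positive spanning set D = [e_1, ..., e_n, -e_1, ..., -e_n]
    and accepts x + alpha * d only if it decreases f by at least
    rho(alpha) = c * alpha**2 (the forcing function). The step size is
    expanded on successful iterations and contracted on unsuccessful ones.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    alpha = alpha0
    fx = f(x)
    # Positive spanning set: coordinate directions and their negatives.
    D = np.vstack([np.eye(n), -np.eye(n)])

    for _ in range(max_iter):
        if alpha < alpha_tol:                 # the step size acts as the stopping criterion
            break
        success = False
        for d in D:                           # poll step: finitely many function evaluations
            x_trial = x + alpha * d
            f_trial = f(x_trial)
            if f_trial < fx - c * alpha**2:   # sufficient decrease test
                x, fx = x_trial, f_trial      # successful iteration: move
                alpha *= gamma                # (optionally) expand the step
                success = True
                break
        if not success:
            alpha *= theta                    # unsuccessful iteration: shrink the step
    return x, fx

# Example: minimize a smooth quadratic.
if __name__ == "__main__":
    x_star, f_star = direct_search(lambda x: np.sum((x - 1.0) ** 2), x0=np.zeros(4))
    print(x_star, f_star)

In analyses of this kind, the step size is the natural stopping criterion: at unsuccessful iterations the gradient norm of a smooth objective can be bounded by a constant multiple of the current step size, which is what links the alpha-based test above to the gradient-norm threshold appearing in the complexity bound.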
Similar papers
Smoothing and Worst-Case Complexity for Direct-Search Methods in Nonsmooth Optimization
In the context of the derivative-free optimization of a smooth objective function, it has been shown that the worst case complexity of direct-search methods is of the same order as the one of steepest descent for derivative-based optimization, more precisely that the number of iterations needed to reduce the norm of the gradient of the objective function below a certain threshold is proportiona...
A second-order globally convergent direct-search method and its worst-case complexity
Direct-search algorithms form one of the main classes of algorithms for smooth unconstrained derivative-free optimization, due to their simplicity and their well-established convergence results. They proceed by iteratively looking for improvement along some vectors or directions. In the presence of smoothness, first-order global convergence comes from the ability of the vectors to approximate t...
On the optimal order of worst case complexity of direct search
The worst case complexity of direct-search methods has been recently analyzed when they use positive spanning sets and impose a sufficient decrease condition to accept new iterates. Assuming that the objective function is smooth, it is now known that such methods require at most O(n²ε⁻²) function evaluations to compute a gradient of norm below ε ∈ (0, 1), where n is the dimension of the problem. S... (A rough sketch of where this n²ε⁻² accounting comes from is given after this list.)
Complexity and global rates of trust-region methods based on probabilistic models
Trust-region algorithms have been proved to globally converge with probability one when the accuracy of the trust-region models is imposed with a certain probability conditioning on the iteration history. In this paper, we study their complexity, providing global rates and worst case complexity bounds on the number of iterations (with overwhelmingly high probability), for both first and second ...
Direct search based on probabilistic feasible descent for bound and linearly constrained problems
Direct search is a methodology for derivative-free optimization whose iterations are characterized by evaluating the objective function using a set of polling directions. In deterministic direct search applied to smooth objectives, these directions must somehow conform to the geometry of the feasible region and typically consist of positive generators of approximate tangent cones (which then re...
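For reference, here is a rough accounting (in LaTeX) of where an O(n²ε⁻²) evaluation bound of the kind quoted in "On the optimal order of worst case complexity of direct search" can come from. It assumes polling with the 2n coordinate directions and their negatives, whose cosine measure is 1/√n, and it sketches the standard argument rather than either paper's exact constants.

% Iteration bound for direct search with sufficient decrease, in terms of the
% cosine measure cm(D) of the polling set D (constants involving the Lipschitz
% constant of the gradient are absorbed into the O-notation):
\[
  k_{\epsilon} \;=\; \mathcal{O}\!\left(\mathrm{cm}(D)^{-2}\,\epsilon^{-2}\right)
  \qquad \text{iterations to reach } \|\nabla f(x_k)\| \le \epsilon .
\]
% For D_\oplus = [\,I \;\; -I\,] one has |D_\oplus| = 2n and cm(D_\oplus) = 1/\sqrt{n}, so
\[
  \#\text{evaluations} \;\le\; |D_\oplus|\, k_{\epsilon}
  \;=\; 2n \cdot \mathcal{O}\!\left(n\,\epsilon^{-2}\right)
  \;=\; \mathcal{O}\!\left(n^{2}\,\epsilon^{-2}\right).
\]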
Journal: EURO J. Computational Optimization
Volume 1, Issue -
Pages: -
Published: 2013